Nonlinear regression model generation using hyperparameter optimization
Authors
Abstract
Similar resources
Nonlinear regression model generation using hyperparameter optimization
An algorithm for inductive model generation and model selection is proposed to solve the problem of automatically constructing regression models. A regression model is an admissible superposition of smooth functions given by experts. Coherent Bayesian inference is used to estimate the model parameters. It introduces hyperparameters, which describe the distribution function of the model parameters...
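In a coherent Bayesian treatment of this kind, the hyperparameters typically enter as parameters of the prior over the model weights and are set by maximizing the evidence. A generic two-level sketch in standard evidence-framework notation (an illustration, not necessarily the paper's exact formulation):

```latex
% Prior over model parameters w, governed by hyperparameter \alpha
% (the prior precision); \beta is the noise precision of the likelihood.
p(\mathbf{w} \mid \alpha) = \mathcal{N}\!\left(\mathbf{w} \mid \mathbf{0},\, \alpha^{-1}\mathbf{I}\right)

% Hyperparameters are estimated by maximizing the marginal likelihood
% (evidence), obtained by integrating the parameters out:
p(\mathcal{D} \mid \alpha, \beta)
  = \int p(\mathcal{D} \mid \mathbf{w}, \beta)\, p(\mathbf{w} \mid \alpha)\, \mathrm{d}\mathbf{w}
```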
Applying Model-Based Optimization to Hyperparameter Optimization in Machine Learning
This talk will cover the main components of sequential model-based optimization algorithms. Algorithms of this kind represent the state-of-the-art for expensive black-box optimization problems and are getting increasingly popular for hyper-parameter optimization of machine learning algorithms, especially on larger data sets. The talk will cover the main components of sequential model-based optim...
Practical Hyperparameter Optimization
Recently, the bandit-based strategy Hyperband (HB) was shown to yield good hyperparameter settings of deep neural networks faster than vanilla Bayesian optimization (BO). However, for larger budgets, HB is limited by its random search component, and BO works better. We propose to combine the benefits of both approaches to obtain a new practical state-of-the-art hyperparameter optimization metho...
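The Hyperband half of that combination rests on successive halving: evaluate many configurations on a small budget, then repeatedly promote the best fraction to an eta-times larger budget. A minimal sketch, where `toy_loss`, the learning-rate search range, and all constants are illustrative inventions, not taken from the paper:

```python
import random

def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Keep the best 1/eta of the surviving configurations at each round,
    re-evaluating the survivors on an eta-times larger budget
    (lower loss is better)."""
    budget, survivors = min_budget, list(configs)
    while len(survivors) > 1:
        scored = sorted(survivors, key=lambda c: evaluate(c, budget))
        survivors = scored[:max(1, len(scored) // eta)]
        budget *= eta
    return survivors[0]

# Toy objective: true loss |lr - 0.1|, plus noise that shrinks as the
# budget grows (mimicking longer training runs being more reliable).
def toy_loss(lr, budget):
    noise = random.Random(hash((lr, budget))).uniform(0, 1.0 / budget)
    return abs(lr - 0.1) + noise

random.seed(0)
grid = [10 ** random.uniform(-4, 0) for _ in range(27)]  # 27 random learning rates
best = successive_halving(grid, toy_loss)
```

With 27 configurations and eta=3, this runs three rounds (27 at budget 1, 9 at budget 3, 3 at budget 9) before returning a single survivor.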
Hyperparameter Search Space Pruning - A New Component for Sequential Model-Based Hyperparameter Optimization
The optimization of hyperparameters is often done manually or exhaustively, but recent work has shown that automatic methods can optimize hyperparameters faster and even achieve better final performance. Sequential model-based optimization (SMBO) is the current state-of-the-art framework for automatic hyperparameter optimization. Currently, it consists of three components: a surrogate model, an ac...
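The SMBO loop those components form (fit a surrogate to past evaluations, pick the next point via an acquisition function, evaluate, repeat) can be sketched minimally as follows; the nearest-neighbour surrogate and the distance-based exploration bonus are deliberate toy stand-ins for the Gaussian-process or random-forest surrogates and expected-improvement acquisition used in real SMBO systems:

```python
import random

def smbo(objective, bounds, n_init=5, n_iter=20, seed=0):
    """Minimal SMBO loop over one hyperparameter. Surrogate: predict the
    loss of the nearest observed point. Acquisition: predicted loss minus
    a distance bonus, so unexplored regions also get sampled."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [rng.uniform(lo, hi) for _ in range(n_init)]  # initial design
    ys = [objective(x) for x in xs]
    for _ in range(n_iter):
        def acquisition(x):
            # 1-nearest-neighbour surrogate prediction for x.
            dist, pred = min((abs(x - xi), yi) for xi, yi in zip(xs, ys))
            return pred - 0.5 * dist  # low predicted loss, but explore gaps
        # Optimize the acquisition over random candidates, then evaluate.
        x_next = min((rng.uniform(lo, hi) for _ in range(200)), key=acquisition)
        xs.append(x_next)
        ys.append(objective(x_next))
    best = min(range(len(ys)), key=ys.__getitem__)
    return xs[best], ys[best]

# Toy objective with its optimum at x = 0.3.
x_opt, y_opt = smbo(lambda x: (x - 0.3) ** 2, (0.0, 1.0))
```

The same loop structure carries over unchanged when the surrogate and acquisition function are swapped for the real components the abstract enumerates.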
Hyperparameter Optimization: A Spectral Approach
We give a simple, fast algorithm for hyperparameter optimization inspired by techniques from the analysis of Boolean functions. We focus on the high-dimensional regime, where the canonical example is training a neural network with a large number of hyperparameters. The algorithm, an iterative application of compressed sensing techniques for orthogonal polynomials, requires only uniform sampling ...
Journal
Journal title: Computers & Mathematics with Applications
Year: 2010
ISSN: 0898-1221
DOI: 10.1016/j.camwa.2010.03.021